International Journal of Data Science and Big Data Analytics
Volume 4, Issue 1, May 2024
Research Paper | Open Access
Improved Dimension Reduction for Pattern Classification Using Noise Eigenspace Projection |
Leif E. Peterson1*
1NXG Logic LLC, Houston, Texas USA 77030. E-mail: peterson.leif.e@gmail.com
*Corresponding Author
Int.J.Data.Sci. & Big Data Anal. 4(1) (2024) 34-51, DOI: https://doi.org/10.51483/IJDSBDA.4.1.2024.34-51
Received: 23/01/2024 | Accepted: 18/04/2024 | Published: 05/05/2024
Abstract
A novel subspace method is proposed to improve dimension reduction for pattern classification. As a dimension reduction (DR) technique, information-to-noise estimators (INEs) project feature values onto class-specific noise eigenvectors derived from the correlation matrix of each class. Parameter and ablation studies indicate that INEs offer improved stability of classification performance. Class prediction results for 14 datasets show that use of INEs yielded a 15.5% relative increase in mean accuracy (from 84.5% to 97.6%) compared with other DR methods. Applying INEs prior to classification also outperformed typical input feature selection prior to classification, with an 18.3% improvement in mean accuracy (0.97 vs. 0.82). In addition, the standard deviation of accuracy over all datasets was 0.14-0.16 for other DR methods and 0.078 for INE, a decrease of approximately 48%. We advocate intraclass noise eigenspace projection for pattern classification as an alternative to input feature selection or to DR with other methods prior to classification analysis. In conclusion, use of INEs based on the class-specific noise subspace can greatly improve classification performance and model parsimony without the need for input feature selection.
Keywords: Classification, Correlation, Eigenspace, Feature extraction, Principal components analysis
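The abstract describes the projection only in words. As a rough illustration, the following Python sketch shows one plausible reading of class-specific noise eigenspace projection: each class's correlation matrix is eigendecomposed and samples are projected onto the eigenvectors with the smallest eigenvalues. The function name noise_eigenspace_features, the parameter n_noise, the class-specific standardization, and the rule of taking the smallest-eigenvalue eigenvectors as the noise subspace are illustrative assumptions, not the paper's exact procedure, which is given in the full text.

```python
import numpy as np

def noise_eigenspace_features(X, y, n_noise=5):
    """Sketch of class-specific noise eigenspace projection (assumed form).

    For each class: form the class correlation matrix, eigendecompose it,
    keep the eigenvectors with the smallest eigenvalues (assumed 'noise'
    subspace), and project every sample onto them. The per-class projection
    scores are concatenated as new features for a downstream classifier.
    """
    classes = np.unique(y)
    blocks = []
    for c in classes:
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        sd = Xc.std(axis=0, ddof=1)
        sd[sd == 0] = 1.0                       # guard against constant features
        R = np.corrcoef(Xc, rowvar=False)       # class-specific correlation matrix
        _, evecs = np.linalg.eigh(R)            # eigenvalues in ascending order
        V_noise = evecs[:, :n_noise]            # assumed noise subspace: smallest eigenvalues
        Z = (X - mu) / sd                       # standardize all samples w.r.t. class c
        blocks.append(Z @ V_noise)              # projection scores on class-c noise axes
    return np.hstack(blocks)                    # shape: (n_samples, n_classes * n_noise)

# Example usage with synthetic data; the resulting features feed any classifier.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = rng.integers(0, 2, size=200)
F = noise_eigenspace_features(X, y, n_noise=5)
print(F.shape)                                  # (200, 10)
```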